Sharp Inequalities for $f$-divergences
Authors
Abstract
f-divergences form a general class of divergences between probability measures that includes, as special cases, many divergences commonly used in probability, mathematical statistics, and information theory, such as the Kullback-Leibler divergence, chi-squared divergence, squared Hellinger distance, and total variation distance. In this paper, we study the problem of maximizing or minimizing an f-divergence between two probability measures subject to a finite number of constraints on other f-divergences. We show that these infinite-dimensional optimization problems can all be reduced to tractable optimization problems over small finite-dimensional spaces. Our results lead to a comprehensive and unified treatment of the problem of obtaining sharp inequalities between f-divergences. We demonstrate that many existing results on inequalities between f-divergences can be obtained as special cases of our results, and we also improve on some existing non-sharp inequalities.
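For a concrete reading of the definition above, here is a minimal sketch (the function names and the example distributions are mine, not from the paper) that evaluates D_f(P‖Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for discrete distributions with full support, using the standard generators of the divergences named in the abstract:

```python
import math

# D_f(P || Q) = sum_i q_i * f(p_i / q_i), where the generator f is
# convex on (0, inf) with f(1) = 0.
def f_divergence(p, q, f):
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard generators for the divergences listed in the abstract.
generators = {
    "kullback_leibler": lambda t: t * math.log(t),
    "chi_squared":      lambda t: (t - 1.0) ** 2,
    "sq_hellinger":     lambda t: (math.sqrt(t) - 1.0) ** 2,
    "total_variation":  lambda t: 0.5 * abs(t - 1.0),
}

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
for name, f in generators.items():
    print(f"{name}: {f_divergence(p, q, f):.4f}")
```

Each choice of generator recovers the corresponding classical divergence; for instance, with f(t) = ½|t − 1| the sum reduces to ½ Σᵢ |pᵢ − qᵢ|, the total variation distance.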
Similar Resources
A Class of New Metrics Based on Triangular Discrimination
In information theory, statistics, and other application areas, information-theoretic divergences are widely used. To meet the requirement of metric properties, we introduce a class of new bounded metrics based on triangular discrimination. Moreover, we obtain some sharp inequalities for the triangular discrimination and other information-theoretic divergences. Their a...
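The triangular discrimination this blurb refers to has the standard form Δ(P, Q) = Σᵢ (pᵢ − qᵢ)²/(pᵢ + qᵢ); a small sketch (function name mine) computes it directly, noting that it is itself an f-divergence with generator f(t) = (t − 1)²/(t + 1):

```python
def triangular_discrimination(p, q):
    # Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i),
    # the f-divergence generated by f(t) = (t - 1)^2 / (t + 1).
    # Bounded: 0 <= Delta <= 2, with the maximum at disjoint supports.
    return sum((pi - qi) ** 2 / (pi + qi)
               for pi, qi in zip(p, q) if pi + qi > 0)

p = [0.5, 0.5]
q = [1.0, 0.0]
print(triangular_discrimination(p, q))  # prints 2/3 for this pair
```

The boundedness claimed in the blurb is visible from the formula: each term is at most pᵢ + qᵢ, so Δ never exceeds 2.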
Mixed f-divergence and inequalities for log concave functions
Mixed f-divergences, a concept from information theory and statistics, measure the difference between multiple pairs of distributions. We introduce them for log concave functions and establish some of their properties. Among them are affine invariant vector entropy inequalities, like new Alexandrov-Fenchel type inequalities and an affine isoperimetric inequality for the vector form of the Kull...
Mathematical inequalities for some divergences
arXiv:1104.5603v2 [cond-mat.stat-mech] 18 Oct 2011. S. Furuichi and F.-C. Mitroi. Department of Computer Science and System Analysis, College of Humanities and Sciences, Nihon University, 3-25-40, Sakurajyousui, Setagaya-ku, Tokyo, 156-8550, Japan; University of Craiova, Department of Mathematics, Street A. I. Cuza 13, Craiova, RO-2005...
Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities
Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such as the Kullback-Leibler divergence, Hellinger distance, Bhattacharyya distance (via coefficient), [Formula: see text]-divergence, total variation ...
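The Zipf–Mandelbrot law mentioned here is a finite power-law distribution over ranks; a small sketch (the parameter names `n`, `q`, `s` are my notation, not the paper's) builds its pmf and feeds two instances into the Kullback-Leibler divergence:

```python
import math

def zipf_mandelbrot(n, q, s):
    # p_i proportional to 1 / (i + q)^s over ranks i = 1..n,
    # with shift q >= 0 and exponent s > 0.
    weights = [1.0 / (i + q) ** s for i in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def kl_divergence(p, r):
    # D(P || R) = sum_i p_i * log(p_i / r_i)
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

p = zipf_mandelbrot(10, q=2.0, s=1.0)
r = zipf_mandelbrot(10, q=0.5, s=1.2)
print(kl_divergence(p, r))  # nonnegative; zero only if the laws coincide
```

The pmf is monotonically decreasing in rank, which is the qualitative shape the Jensen-type bounds in the cited work are applied to.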
On Pairs of f-divergences and their Joint Range
Many of the divergence measures used in statistics are of the f-divergence type introduced independently by I. Csiszár [1], T. Morimoto [2], and Ali and Silvey [3]. Such divergence measures have been studied in great detail in [4]. Often one is interested in inequalities for one f-divergence in terms of another f-divergence. Such inequalities are for ins...
Journal: IEEE Trans. Information Theory
Volume: 60, Issue: -
Pages: -
Publication year: 2014